Three-Tier Accessibility Strategy
**Goal:** Comprehensive WCAG 2.1 AA compliance through a 70/30 automated/manual split
Overview
```
┌──────────────────────────────────────────────────────┐
│                 THREE-TIER STRATEGY                  │
├──────────────────────────────────────────────────────┤
│ Tier 1: Unit Tests    │ 40% automated │ jest-axe     │
│ Tier 2: E2E Scans     │ 30% automated │ @axe-core    │
│ Tier 3: Manual Audits │ 30% manual    │ Human checks │
└──────────────────────────────────────────────────────┘
```

**Purpose:** Achieve WCAG 2.1 AA compliance through a balanced approach combining automated testing (70%) and manual accessibility audits (30%). This strategy acknowledges that while automated tools can catch many common accessibility issues, comprehensive compliance requires human testing for screen reader compatibility, keyboard navigation flows, and edge cases.
**Legal Context:** WCAG 2.1 AA compliance is becoming mandatory across many jurisdictions by 2026-2027. Non-compliance may result in legal action, fines, and exclusion of users with disabilities.
Tier 1: Component-Level Unit Tests (40% Automated)
**Tool:** jest-axe with Vitest
**What it covers:** ARIA attributes, semantic HTML, component accessibility
**When it runs:** Every PR (fast feedback)
**Limitations:** Cannot test interaction flows, dynamic content, keyboard navigation
Test Coverage
- **ARIA Attributes:** Labels, roles, properties, states
- **Semantic HTML:** Proper heading hierarchy, landmarks, lists
- **Form Accessibility:** Label associations, required fields, error messaging
- **Component Structure:** Buttons, dialogs, inputs, menus, tabs
What Gets Caught
- Missing aria-labels on icon-only buttons
- Improper heading hierarchy (h1 through h6)
- Missing alt text on images
- Unlabeled form inputs
- Broken ARIA role/attribute combinations
- Invalid HTML structure
What Gets Missed
- Screen reader announcement quality (is it understandable?)
- Keyboard navigation order (does tab flow make sense?)
- Focus management (where does focus go after modal closes?)
- Dynamic content announcements (does user know content changed?)
- Complex interaction patterns (drag-drop, custom widgets)
Files
- **Test Location:** `src/components/ui/**/__tests__/*.a11y.test.tsx`
- **Test Setup:** `vitest.setup.ts` (jest-axe configuration)
- **Coverage:** 44 component tests (as of Phase 108-02)
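The jest-axe matcher registration referenced above typically lives in the setup file. A minimal sketch (assuming jest-axe's matcher is registered against Vitest's `expect`, as the project's config implies — the actual file may contain more):

```ts
// vitest.setup.ts (sketch) — register jest-axe's matcher once for all test files
import { expect } from 'vitest';
import { toHaveNoViolations } from 'jest-axe';

expect.extend(toHaveNoViolations);
```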
Running Tier 1 Tests
```bash
# Run all component accessibility tests
npm run test:a11y:unit

# Run specific component test
npx vitest src/components/ui/button/__tests__/button.a11y.test.tsx

# Run with coverage
npm run test:a11y:unit -- --coverage
```
Example Test
```tsx
import { axe, toHaveNoViolations } from 'jest-axe';
import { render } from '@testing-library/react';
import { Button } from '../button';

expect.extend(toHaveNoViolations);

describe('Button accessibility', () => {
  it('should not have accessibility violations', async () => {
    const { container } = render(<Button>Click me</Button>);
    const results = await axe(container);
    expect(results).toHaveNoViolations();
  });

  it('should have aria-label when using icon-only button', async () => {
    // Without an accessible name, axe flags icon-only buttons (button-name rule),
    // so the aria-label is required for this test to pass.
    const { container } = render(<Button icon={<XIcon />} aria-label="Close" />);
    const results = await axe(container);
    expect(results).toHaveNoViolations();
  });
});
```
Tier 2: E2E Accessibility Scans (30% Automated)
**Tool:** @axe-core/playwright with Playwright
**What it covers:** Page-level accessibility, critical user paths
**When it runs:** Every PR, before releases
**Limitations:** Cannot test screen reader experience, some keyboard flows
Test Coverage
- **Critical User Paths:**
- Authentication (login, signup, password reset)
- Agent management (create, configure, monitor)
- Canvas/Skills marketplace
- Billing/Usage dashboards
- Admin tenant/user management
- **Accessibility Tree:**
- ARIA attribute verification in full page context
- Heading hierarchy across entire page
- Landmark regions (header, nav, main, footer)
- Focusable elements order
- **WCAG 2.1 AA Rules:**
- Color contrast (4.5:1 for text, 3:1 for UI components)
- Alt text on images
- Form label associations
- Link purpose (descriptive text or aria-label)
- Duplicate IDs (must be unique)
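The 4.5:1 and 3:1 thresholds above come from WCAG's relative-luminance formula. For reference, here is a self-contained sketch of the computation (axe-core implements this internally, so this is illustrative only):

```typescript
// WCAG relative luminance: linearize each sRGB channel, then weight R/G/B.
function srgbChannel(c: number): number {
  const s = c / 255;
  return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
}

function relativeLuminance(rgb: [number, number, number]): number {
  const [r, g, b] = rgb.map(srgbChannel);
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

// Contrast ratio = (L_lighter + 0.05) / (L_darker + 0.05), in the range 1..21.
function contrastRatio(
  fg: [number, number, number],
  bg: [number, number, number]
): number {
  const l1 = relativeLuminance(fg);
  const l2 = relativeLuminance(bg);
  const [lighter, darker] = l1 >= l2 ? [l1, l2] : [l2, l1];
  return (lighter + 0.05) / (darker + 0.05);
}

// Black on white yields the maximum ratio of 21:1, well above the 4.5:1 AA floor.
```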
What Gets Caught
- Page-level ARIA issues
- Color contrast violations
- Missing page titles
- Broken skip navigation links
- Duplicate IDs
- Low contrast in interactive elements
What Gets Missed
- Screen reader behavior (is announcement helpful?)
- Keyboard-only usability (can user complete task?)
- Mobile touch target size (44x44px minimum)
- Orientation and resize handling
- Custom component keyboard support
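One of these gaps, touch target size, is straightforward to check numerically once you have an element's bounding box (for example from Playwright's `locator.boundingBox()`). The helper below is an illustrative sketch, not part of the existing suite:

```typescript
// Sketch: verify a bounding box against the 44x44 CSS px minimum used
// in this strategy's mobile checklist.
interface Box {
  width: number;
  height: number;
}

function meetsTouchTarget(box: Box | null, min = 44): boolean {
  // Hidden elements have no box; treat them as failing so they get reviewed.
  if (!box) return false;
  return box.width >= min && box.height >= min;
}
```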
Files
- **E2E Tests:** `tests/e2e/a11y.spec.ts`, `tests/e2e/keyboard-navigation.spec.ts`, `tests/e2e/color-contrast.spec.ts`
- **Test Results:** `test-results/a11y-results.json`
- **Violation Report:** `docs/accessibility/VIOLATION_REPORT.md`
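The violation report is built from the JSON results file above. A reducer like the following (an illustrative sketch — the real `a11y:report` script may differ; the `Violation` shape mirrors axe-core's results format) groups violations by impact for the markdown report:

```typescript
// Sketch: summarize axe-core violations by impact for VIOLATION_REPORT.md.
interface Violation {
  id: string;
  impact: string; // 'critical' | 'serious' | 'moderate' | 'minor'
  help: string;
  nodes: unknown[];
}

function summarize(violations: Violation[]): string {
  const order = ['critical', 'serious', 'moderate', 'minor'];
  const lines = order
    .map((impact) => {
      const hits = violations.filter((v) => v.impact === impact);
      return hits.length
        ? `- ${impact}: ${hits.length} (${hits.map((v) => v.id).join(', ')})`
        : '';
    })
    .filter(Boolean);
  return lines.join('\n') || 'No violations found.';
}
```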
Running Tier 2 Tests
```bash
# Run all E2E accessibility tests
npm run test:a11y

# Run with JSON output for report generation
npm run test:a11y:ci

# Run keyboard navigation tests
npm run test:a11y:keyboard

# Run color contrast tests
npm run test:a11y:contrast

# Run full E2E accessibility suite
npm run test:a11y:full
```
Example Test
```ts
import { test, expect } from '@playwright/test';
import AxeBuilder from '@axe-core/playwright';

test.describe('Authentication accessibility', () => {
  test('login page should not have accessibility violations', async ({ page }) => {
    await page.goto('/login');

    // Run an axe-core scan scoped to WCAG 2.1 AA rules
    const accessibilityScanResults = await new AxeBuilder({ page })
      .withTags(['wcag2a', 'wcag2aa', 'wcag21a', 'wcag21aa'])
      .analyze();

    expect(accessibilityScanResults.violations).toEqual([]);
  });

  test('login form should be keyboard navigable', async ({ page }) => {
    await page.goto('/login');

    // Tab through the form
    await page.keyboard.press('Tab'); // Email input
    await page.keyboard.press('Tab'); // Password input
    await page.keyboard.press('Tab'); // Submit button

    // Verify focus ended on the submit button
    const focusedElement = await page.evaluate(() => document.activeElement?.tagName);
    expect(focusedElement).toBe('BUTTON');
  });
});
```
Tier 3: Manual Audits (30% Manual)
**Tools:** NVDA (Windows), VoiceOver (macOS), keyboard-only testing
**What it covers:** Screen reader experience, edge cases, mobile accessibility
**When it runs:** Monthly, before major releases, when introducing new UI patterns
**Limitations:** Time-intensive, requires accessibility expertise
Test Coverage
- **Screen Reader Compatibility:**
- NVDA (Windows) + Firefox/Chrome
- VoiceOver (macOS) + Safari/Chrome
- Announcement quality and timing
- Navigation efficiency
- **Keyboard-Only Navigation:**
- Can user complete tasks without mouse?
- Focus visible on all interactive elements?
- Tab order matches visual order?
- Escape key behavior consistent?
- **Mobile Accessibility:**
- Touch target size (44x44px minimum)
- Orientation changes handled gracefully
- Pinch-zoom not disabled (unless essential)
- Screen orientation not locked
- **Dynamic Content:**
- Modal open/close announcements
- Loading states (aria-live)
- Page updates (live regions)
- Error messages (role="alert")
- **Complex Components:**
- Drag-and-drop interfaces
- Custom dropdowns/menus
- Data tables (sorting, filtering)
- Multi-step wizards
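Two of the manual-audit expectations above can be pinned down precisely: which ARIA live-region attributes a dynamic message should carry, and how arrow keys should move focus in a custom menu. Both helpers below are illustrative sketches, not existing project code:

```typescript
// Live regions: role="alert" implies aria-live="assertive" (interrupts the user);
// role="status" implies aria-live="polite" (announced at the next pause).
function liveRegionAttrs(kind: 'loading' | 'error'): Record<string, string> {
  return kind === 'error'
    ? { role: 'alert', 'aria-live': 'assertive', 'aria-atomic': 'true' }
    : { role: 'status', 'aria-live': 'polite', 'aria-atomic': 'true' };
}

// Roving focus for a custom menu: ArrowDown/ArrowUp wrap around, Home/End jump.
function nextIndex(current: number, count: number, key: string): number {
  switch (key) {
    case 'ArrowDown': return (current + 1) % count;
    case 'ArrowUp':   return (current - 1 + count) % count;
    case 'Home':      return 0;
    case 'End':       return count - 1;
    default:          return current;
  }
}
```

A manual audit then checks that the rendered DOM and observed focus movement actually match these expectations.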
What Gets Caught
- Screen reader confusion (announcements unclear)
- Keyboard traps (can't exit with keyboard)
- Focus management issues (focus lost after action)
- Mobile touch targets too small
- Dynamic content changes not announced
- Complex widget keyboard support missing
What Gets Missed (Requires User Testing)
- Actual user efficiency (how long does task take?)
- Cognitive load (is interface overwhelming?)
- Alternative input methods (switch devices, eye tracking)
- Real-world assistive technology combinations
Files
- **Procedures:** `docs/accessibility/SCREEN_READER_TESTING.md`
- **Checklist:** `docs/accessibility/MANUAL_TESTING_CHECKLIST.md`
- **Schedule:** Quarterly audits (see Execution Flow)
Running Tier 3 Tests
See `docs/accessibility/SCREEN_READER_TESTING.md` for detailed procedures:
```bash
# Before manual audit, run automated suite first
npm run test:a11y:full

# Generate violation report to guide manual testing
npm run a11y:report

# Follow manual testing checklist
# See: docs/accessibility/MANUAL_TESTING_CHECKLIST.md
```
Execution Flow
For Each Pull Request (Automated Only)
**Time:** <2 minutes
**Trigger:** Every PR to main branch
**Execution:** GitHub Actions workflow
```bash
# Tier 1: Component unit tests (40% coverage)
npm run test:a11y:unit

# Tier 2: E2E accessibility scans (30% coverage)
npm run test:a11y:ci

# Fail PR on critical violations
# Generate report artifact
```
**What happens:**
- PR opened/updated
- GitHub Actions workflow triggered
- Tier 1 runs (fast, ~30 seconds)
- Tier 2 runs (slower, ~90 seconds)
- Report generated if violations found
- PR fails if critical violations detected
- Developer reviews and fixes violations
For Releases (Automated + Manual)
**Time:** 2-4 hours (including manual testing)
**Trigger:** Before major release, monthly
**Execution:** Manual + automated
```bash
# Step 1: Run full automated suite
npm run test:a11y:full

# Step 2: Generate violation report
npm run a11y:report

# Step 3: Manual screen reader testing
# Follow: docs/accessibility/SCREEN_READER_TESTING.md
# - NVDA testing (Windows)
# - VoiceOver testing (macOS)

# Step 4: Manual keyboard-only testing
# Follow: docs/accessibility/MANUAL_TESTING_CHECKLIST.md
# - Complete critical paths without mouse
# - Verify focus management
# - Test escape key behavior

# Step 5: Remediate violations
# Priority: critical > serious > moderate
# Track in GitHub issues with `a11y` label

# Step 6: Re-verify fixes
# Run automated suite again
# Confirm manual testing passes
```
**What happens:**
- Automated suite runs (catches 70% of issues)
- Violation report guides manual testing focus
- Manual testing catches remaining 30%
- Issues documented and prioritized
- Fixes implemented and verified
- Release approved when critical/serious issues resolved
Monthly Audits (Manual Focus)
**Time:** 4-8 hours
**Trigger:** First Monday of each month
**Execution:** Manual testing + documentation
```bash
# Step 1: Full manual audit
# - Screen reader testing (NVDA + VoiceOver)
# - Keyboard-only navigation (all critical paths)
# - Mobile touch target verification
# - Dynamic content announcements

# Step 2: Update accessibility debt tracking
# - Document new violations found
# - Track remediation progress
# - Identify patterns for developer training

# Step 3: Review and improve test coverage
# - Add missing automated tests for recurring issues
# - Update jest-axe component tests
# - Extend E2E test coverage for edge cases

# Step 4: Developer training (if needed)
# - Share lessons learned
# - Update accessibility guidelines
# - Add code review checklist items
```
Coverage Targets
| Tier | Target | Current | Gap | Notes |
|---|---|---|---|---|
| Tier 1 (Unit) | 40% | 44 tests | - | Component-level jest-axe tests |
| Tier 2 (E2E) | 30% | 51 tests | - | @axe-core page scans |
| Tier 3 (Manual) | 30% | Quarterly | - | Screen reader + keyboard |
| **Total Automated** | **70%** | **95 tests** | - | T1 + T2 combined |
| **Total Manual** | **30%** | **Scheduled** | - | Tier 3 audits |
**Notes:**
- Automated coverage (T1 + T2) = 95 tests across components and E2E paths
- Manual coverage (T3) = Quarterly full audits + new UI patterns
- Target: Catch 70% of WCAG violations automatically, 30% through manual audits
Success Criteria
Automated Testing (Tiers 1 & 2)
- [x] Tier 1: Component-level jest-axe tests for all UI components (40% coverage)
- [x] Tier 2: E2E @axe-core scans for all critical user paths (30% coverage)
- [x] CI/CD runs accessibility tests on every PR
- [x] Report generation for violation tracking
- [ ] 95%+ pass rate on automated tests (current: TBD)
- [ ] <5% false positive rate (acceptable level)
Manual Testing (Tier 3)
- [ ] Quarterly screen reader audits completed (NVDA + VoiceOver)
- [ ] Keyboard-only testing for all critical paths
- [ ] Mobile accessibility verified (touch targets, orientation)
- [ ] Manual testing checklist maintained and updated
- [ ] Violation tracking shows remediation progress
Compliance
- [ ] Zero critical accessibility violations in production
- [ ] <5 serious violations per quarter (acceptable threshold)
- [ ] Moderate violations addressed within 1 quarter
- [ ] WCAG 2.1 AA compliance verified annually
- [ ] Accessibility debt decreasing over time
Documentation
- [x] Three-tier strategy documented
- [x] Screen reader testing procedures available
- [x] Violation reporting guidelines established
- [x] Manual testing checklist maintained
- [x] Developer accessibility guidelines shared
Related Documentation
- **Screen Reader Testing:** `docs/accessibility/SCREEN_READER_TESTING.md`
  - NVDA (Windows) commands and test scenarios
  - VoiceOver (macOS) commands and test scenarios
  - Critical user paths for testing
  - Common issues and fixes
- **Manual Testing Checklist:** `docs/accessibility/MANUAL_TESTING_CHECKLIST.md`
  - Keyboard navigation tests
  - Screen reader tests
  - Mobile accessibility tests
  - Dynamic content tests
- **Violation Reporting:** `docs/accessibility/VIOLATION_REPORTING.md`
  - Standardized violation format
  - Severity levels (critical, serious, moderate)
  - WCAG criteria references
  - Remediation guidance with code examples
- **Color Contrast:** `docs/accessibility/COLOR_CONTRAST_GUIDELINES.md`
  - WCAG AA contrast requirements (4.5:1 text, 3:1 UI)
  - Testing procedures
  - Common violations and fixes
Continuous Improvement
Monthly
- Review automated test results for false positives
- Update jest-axe tests for new components
- Extend E2E coverage for new features
- Document lessons learned
Quarterly
- Full manual audit (screen readers + keyboard)
- Review accessibility debt trends
- Identify patterns for developer training
- Update testing strategy based on findings
Annually
- Comprehensive WCAG 2.1 AA compliance audit
- User testing with people with disabilities
- Strategy review and update for next year
- Training refresh for development team
---
**Strategy Version:** 1.0
**Last Updated:** 2026-03-22
**Next Review:** 2026-06-22 (Quarterly)
**Owner:** Development Team + QA
**For questions or improvements, see:**
- `docs/accessibility/SCREEN_READER_TESTING.md` (procedures)
- `docs/accessibility/VIOLATION_REPORTING.md` (reporting)
- `docs/accessibility/MANUAL_TESTING_CHECKLIST.md` (checklist)